# Academic benchmarking
## Orca Mini V5 8B DPO

**Author:** pankajmathur · **Tags:** Large Language Model, Transformers, English

An 8B-parameter model based on the Llama 3 architecture, trained with various DPO datasets and focused on text-generation tasks.
## TinyMistral-248M-v3

**Author:** M4-ai · **License:** Apache-2.0 · **Tags:** Large Language Model, Transformers, English

TinyMistral-248M-v3 is a small language model with 248M parameters. It is still in training and has processed approximately 21 billion tokens so far.
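Both models are distributed in the Hugging Face Transformers format, so either can be loaded with a few lines of Python. Below is a minimal sketch; the repository IDs `pankajmathur/orca_mini_v5_8b_dpo` and `M4-ai/TinyMistral-248M-v3` are assumptions inferred from the author and model names listed above, so verify them on the hub before use.

```python
# Minimal sketch: loading one of the listed models with Hugging Face Transformers.
# The repository ID below is an assumption based on the author/model names above.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "M4-ai/TinyMistral-248M-v3"  # assumed repo ID; swap in the verified one

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Simple text-generation round trip.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same pattern works for the 8B model, though at that size you would typically also pass a `device_map` or reduced-precision `torch_dtype` to `from_pretrained` to fit it in memory.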